Capacity of Channels with Uncoded-Message Side-Information - Proceedings of the 1995 IEEE International Symposium on Information Theory
Author
Abstract
Parallel independent channels where no encoding is allowed for one of the channels are studied. The Slepian-Wolf theorem on source coding of correlated sources is used to show that any information source whose entropy rate is below the sum of the capacity of the coded channel and the input/output mutual information of the uncoded channel is transmissible with arbitrary reliability. The converse is also shown. Thus, coding of the side-information channel is unnecessary when its mutual information is maximized by the source distribution. An information-theoretic interpretation of parallel-concatenated channel codes and, in particular, Turbo codes is put forth.
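To make the stated condition explicit, the following is a minimal sketch in standard notation; the symbols are labels chosen here for illustration and do not come from the paper: U denotes the source with entropy rate H(U), C_1 the capacity of the coded channel, and I(X_2;Y_2) the input/output mutual information of the uncoded channel when the source is applied to it directly.

\[ H(U) < C_1 + I(X_2;Y_2) \quad\Rightarrow\quad \text{the source is transmissible with arbitrary reliability,} \]
\[ H(U) > C_1 + I(X_2;Y_2) \quad\Rightarrow\quad \text{reliable transmission is impossible (converse).} \]

In particular, if the source distribution itself maximizes I(X_2;Y_2), that term already equals the capacity of the uncoded channel, which is the sense in which encoding the side-information channel offers no gain.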
Similar resources
Unit-Memory Hamming Turbo Codes - Proceedings of the 1995 IEEE International Symposium on Information Theory
Full text
Multilevel Diversity Coding with Symmetrical Connectivity - Proceedings of the 1995 IEEE International Symposium on Information Theory
Full text
The Trellis Complexity of Convolutional Codes - Proceedings of the 1995 IEEE International Symposium on Information Theory
Full text
On the Twisted Squaring Construction, Symmetric-Reversible Designs and Trellis Diagrams of Block Codes - Proceedings of the 1995 IEEE International Symposium on Information Theory
Full text
Capacity Definitions and Coding Strategies for General Channels with Receiver Side Information - Proceedings of the 1998 IEEE International Symposium on Information Theory
We consider three capacity definitions for a channel with channel side information at the receiver. The capacity is the highest rate asymptotically achievable. The outage capacity is the highest rate asymptotically achievable with a given probability of decoder-recognized outage. The expected capacity is the highest expected rate asymptotically achievable using a single encoder and ... (a rough formalization of these notions is sketched after this entry).
Full text
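The capacity notions quoted in the preceding entry can be roughly formalized as follows; this is a sketch under assumed notation (R for a transmission rate, q for the tolerated outage probability), neither of which appears in the excerpt:

\[ C = \sup\{\, R : \text{the decoding-error probability can be made arbitrarily small} \,\}, \]
\[ C_{\mathrm{outage}}(q) = \sup\{\, R : \text{the decoder may declare a recognized outage with probability at most } q \text{ and must decode reliably otherwise} \,\}. \]

The expected capacity is then the largest average successfully decoded rate achievable with a single encoder; the excerpt above is cut off before completing that definition, so it is not reproduced here.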